Improving DBMS performance through diverse redundancy
Database replication is widely used to improve both fault tolerance and DBMS performance. Non-diverse database replication has a significant limitation: it is effective against crash failures only. Diverse redundancy is an effective mechanism for tolerating a wider range of failures, including many non-crash failures. However, it has not been adopted in practice because many see DBMS performance as the main concern. In this paper we present experimental evidence that diverse redundancy (diverse replication) can also bring benefits in terms of DBMS performance. We report experimental results with an optimistic architecture built with two diverse DBMSs under a load derived from the TPC-C benchmark, which show that a diverse pair performs faster not only than non-diverse pairs but also than the individual copies of the DBMSs used. This result is important because it shows potential for DBMS performance better than anything achievable with the available off-the-shelf servers.
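The performance claim can be illustrated with a toy simulation (our own sketch, not the paper's experiment): if each engine of a diverse pair is fast on different operations and the architecture optimistically accepts the first reply per operation, the pair's mean response time can beat both individual replicas. The response-time distributions below are invented for illustration.

```python
import random

random.seed(0)

# Hypothetical per-operation response times (ms) for two diverse DBMS
# engines A and B; each is fast on some operations and slow on others.
def time_a(): return random.choice([5, 5, 40])   # fast on most operations
def time_b(): return random.choice([30, 8, 8])   # fast on different ones

N = 10_000
a = [time_a() for _ in range(N)]
b = [time_b() for _ in range(N)]

mean_a = sum(a) / N
mean_b = sum(b) / N
# Optimistic diverse pair: accept the first (fastest) reply per operation.
mean_pair = sum(min(x, y) for x, y in zip(a, b)) / N

print(mean_a, mean_b, mean_pair)
```

Under these invented distributions the pair's mean latency is lower than either replica's, because the minimum of the two times is taken on every operation.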
A survey on online monitoring approaches of computer-based systems
This report surveys forms of online data collection that are in current use (as well as being the subject of research to adapt them to changing technology and demands) and that can be used as inputs to the assessment of dependability and resilience, although they are not primarily meant for this use.
An Empirical Study of the Effectiveness of 'Forcing Diversity' Based on a Large Population of Diverse Programs
Use of diverse software components is a viable defence against common-mode failures in redundant software-based systems. Various forms of "Diversity-Seeking Decisions" ("DSDs") can be applied to the process of developing, or procuring, redundant components to improve the chances of the resulting components not failing on the same demands. An open question is how effective these decisions, and their combinations, are for achieving large enough reliability gains. Using a large population of software programs, we studied experimentally the effectiveness of specific DSDs (and their combinations) mandating differences between redundant components. Some of these combinations produced much better improvements in system probability of failure per demand (PFD) than "uncontrolled" diversity did. Yet our findings suggest that the gains from such DSDs vary significantly between them and between the application problems studied. The relationship between DSDs and system PFD is complex and does not allow simple universal rules (e.g. "the more diversity the better") to apply.
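The central quantity here, the system PFD of a 1-out-of-2 diverse pair, can be sketched as follows (a toy computation with invented failure sets, not the study's program population): the pair fails only on demands where both versions fail, so its PFD is the size of the intersection of the two versions' failure sets divided by the number of demands.

```python
# Toy illustration: estimate the probability of failure per demand (PFD)
# of 1-out-of-2 diverse pairs from per-demand failure records.
demands = range(1000)

# Hypothetical failure sets: demands on which each version fails.
fails = {
    "v1": {d for d in demands if d % 50 == 0},    # individual PFD 0.02
    "v2": {d for d in demands if d % 40 == 0},    # individual PFD 0.025
    "v3": {d for d in demands if d % 200 == 3},   # individual PFD 0.005
}

def pair_pfd(a, b):
    # A 1-out-of-2 pair fails only on demands where both versions fail.
    return len(fails[a] & fails[b]) / len(demands)

print(pair_pfd("v1", "v2"))  # both fail on multiples of 200: PFD 0.005
print(pair_pfd("v1", "v3"))  # disjoint failure sets: PFD 0.0
```

The comparison of v2 and v3 as partners for v1 mirrors the abstract's point: a partner with a higher individual PFD but correlated failures can yield a worse pair than a partner whose failures are disjoint.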
Compressive sampling of binary images
Compressive sampling is a novel framework that exploits the sparsity of a signal in a transform domain to perform sampling below the Nyquist rate. In this paper, we apply compressive sampling to reduce the sampling rate of binary images. A system is proposed whereby the image is split into non-overlapping blocks of equal size and compressive sampling is performed on selected blocks only, using the orthogonal matching pursuit technique. The remaining blocks are sampled fully. This way, the complexity and the required sampling time are reduced, since the orthogonal matching pursuit operates on a smaller number of samples, and at the same time local sparsity within an image is exploited. Our simulation results show more than 20% saving in acquisition for several binary images.
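The recovery step can be sketched as follows (our own minimal illustration; the block size, measurement matrix and sparsity level are invented, and a real binary-image block would first be represented in a sparsifying transform domain): orthogonal matching pursuit greedily selects the measurement-matrix column most correlated with the current residual, then re-fits by least squares over the selected support.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy sketch: compressive sampling of one k-sparse block, recovered
# with orthogonal matching pursuit (OMP).
n, m, k = 64, 32, 3                      # block length, measurements, sparsity
A = rng.standard_normal((m, n))
A /= np.linalg.norm(A, axis=0)           # unit-norm measurement columns

x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = [3.0, -4.0, 5.0]   # sparse block
y = A @ x                                # m < n compressive measurements

def omp(A, y, max_iter=10, tol=1e-9):
    """Greedy OMP: pick the column most correlated with the residual,
    re-fit by least squares over the selected support, repeat."""
    support, r = [], y.copy()
    for _ in range(max_iter):
        support.append(int(np.argmax(np.abs(A.T @ r))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        r = y - A[:, support] @ coef
        if np.linalg.norm(r) < tol:
            break
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

x_hat = omp(A, y)
print(np.linalg.norm(y - A @ x_hat))     # residual after recovery
```

The saving comes from taking m = 32 measurements instead of n = 64 samples for blocks that are sparse enough for OMP to succeed; the remaining blocks, as the abstract notes, are sampled fully.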
Diversity with AntiVirus products: Additional empirical studies
In this paper we describe the design of a new set of empirical studies we will run to test the gains in detection capabilities from using diverse AntiVirus products. This new work builds on previous work on this topic reported in [1, 2, 3]. We describe the motivation for this work, how it extends the previous work, and what studies we will conduct.
On practical design for joint distributed source and network coding
This paper considers the problem of communicating correlated information from multiple source nodes over a network of noiseless channels to multiple destination nodes, where each destination node wants to recover all sources. The problem involves a joint consideration of distributed compression and network information relaying. Although the optimal rate region has been theoretically characterized, it was not clear how to design practical communication schemes with low complexity. This work provides a partial solution to this problem by proposing a low-complexity scheme for the special case with two sources whose correlation is characterized by a binary symmetric channel. Our scheme is based on a careful combination of linear syndrome-based Slepian-Wolf coding and random linear mixing (network coding). It is in general suboptimal; however, its low complexity and robustness to network dynamics make it suitable for practical implementation.
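The syndrome-based Slepian-Wolf component can be sketched with a toy (7,4) Hamming parity-check matrix (our own illustration, not the paper's code construction): when the BSC correlation flips at most one bit between X and the side information Y, sending the 3-bit syndrome of X suffices for a decoder holding Y to recover X exactly.

```python
import numpy as np

# Toy sketch: syndrome-based Slepian-Wolf coding. Y = X xor E, where the
# "BSC noise" E flips at most one bit, so the encoder sends only the
# 3-bit syndrome H @ X instead of all 7 bits of X.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def syndrome(v):
    return tuple(H @ v % 2)

# Coset leaders: which weight-<=1 error pattern produces each syndrome.
leaders = {syndrome(np.zeros(7, dtype=int)): np.zeros(7, dtype=int)}
for i in range(7):
    e = np.zeros(7, dtype=int); e[i] = 1
    leaders[syndrome(e)] = e

def decode(s, y):
    """Recover x from its syndrome s and the side information y."""
    e = leaders[tuple((np.array(s) + H @ y) % 2)]   # syndrome of x xor y
    return (y + e) % 2

x = np.array([1, 0, 1, 1, 0, 0, 1])
y = x.copy(); y[4] ^= 1                  # correlated side information
s = syndrome(x)                          # 3 bits sent instead of 7
x_hat = decode(s, y)
print(x_hat)
```

The decoder exploits linearity: the syndrome of x xor y equals the syndrome of the error pattern, which the Hamming structure maps back to a unique single-bit flip.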
Metal Removal from Effluents by Electrowinning and a New Design Concept in Wastewater Purification Technology
In recent years there has been increased interest in finding new and innovative solutions for efficient metal removal from effluents. Electrowinning in particular has been considered as a way of efficiently solving water and soil pollution problems. Electrochemical cells designed to operate with effluents at low concentrations require special provisions for the enhancement of mass transport to the electrode surface. Different concepts for doing this are critically reviewed. The various types of cells are described and compared, and some advantages and disadvantages are discussed. Particular attention has been paid to those effluents not suitable for treatment by electrowinning. Pertraction, an emerging technology suitable for separating and concentrating heavy metal ions from very dilute solutions, is described and considered as a candidate to be coupled with electrowinning for heavy metal removal. The proposed process offers many advantages over existing technologies for cleaning heavy metals from wastewater. A comprehensive literature survey of electrochemical reactors as well as of the supported liquid membrane technique is also given.
An Experimental Study of Diversity with Off-The-Shelf AntiVirus Engines
Fault tolerance in the form of diverse redundancy is well known to improve detection rates for both malicious and non-malicious failures. What is of interest to designers of security protection systems are the actual gains in detection rates that it may give. In this paper we provide an exploratory analysis of the potential gains in detection capability from using diverse AntiVirus products for the detection of self-propagating malware. The analysis is based on 1599 malware samples collected by the operation of a distributed honeypot deployment over a period of 178 days. We sent these samples to the signature engines of 32 different AntiVirus products, taking advantage of the VirusTotal service. The resulting dataset allowed us to analyse the effects of diversity on the detection capability of these components, as well as how their detection capability evolves in time.
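The 1-out-of-2 detection gain can be sketched as follows (invented detection sets, not the paper's 1599-sample dataset): a diverse configuration detects a malware sample if at least one of its engines flags it, so its detection rate is governed by the union of the engines' detection sets.

```python
# Toy sketch: detection-rate gain from running diverse AntiVirus
# signature engines in a 1-out-of-N configuration.
detects = {  # hypothetical sets of samples each engine detects
    "AV-A": {"m1", "m2", "m3", "m5"},
    "AV-B": {"m1", "m4", "m5", "m6"},
    "AV-C": {"m2", "m3", "m4", "m5"},
}
samples = {"m1", "m2", "m3", "m4", "m5", "m6", "m7"}

def rate(*avs):
    # A diverse set detects a sample if at least one engine flags it.
    caught = set().union(*(detects[a] for a in avs))
    return len(caught & samples) / len(samples)

print(rate("AV-A"))          # single engine
print(rate("AV-A", "AV-B"))  # diverse pair: union of detection sets
```

Note that, as in the PFD case for reliability, the gain depends on how much the engines' detection sets overlap, not just on their individual rates.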
Measuring the energy intensity of domestic activities from smart meter data
Household electricity consumption can be broken down to appliance end-use through a variety of methods such as modelling, sub-metering, load disaggregation or non-intrusive appliance load monitoring (NILM). We advance and complement this important field of energy research through an innovative methodology that characterises the energy consumption of domestic life by making the linkages between appliance end-use and activities through an ontology built from qualitative data about the household and NILM data. We use activities as a descriptive term for the common ways households spend their time at home. These activities, such as cooking or laundering, are meaningful to households' own lived experience. Thus, besides strictly technical algorithmic approaches for processing quantitative smart meter data, we also draw on social science time use approaches and interview and ethnography data. Our method disaggregates a household's total electricity load down to appliance level and provides the start time, duration, and total electricity consumption for each occurrence of appliance usage. We then make inferences about activities occurring in the home by combining these disaggregated data with an ontology that formally specifies the relationships between electricity-using appliances and activities. We also propose two novel standardised metrics to enable easy quantifiable comparison within and across households of the energy intensity and routine of activities of interest. Finally, we demonstrate our results over a sample of ten households with an in-depth analysis of which activities can be inferred with the qualitative and quantitative data available for each household at any time, and the level of accuracy with which each activity can be inferred, unique to each household.
This work has important applications, from providing meaningful energy feedback to households to comparing the energy efficiency of households' daily activities and exploring the potential to shift the timing of activities for demand management.
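The activity-inference step can be sketched as follows (hypothetical appliance events and a hypothetical appliance-to-activity ontology, not the paper's data or its two proposed metrics): disaggregated appliance events are mapped to activities through the ontology, and energy is then aggregated per activity to give a simple intensity figure (kWh per occurrence).

```python
# Toy sketch: infer activities from disaggregated appliance events and
# compute a simple energy-intensity metric (kWh per occurrence).
events = [  # (appliance, start_hour, duration_min, kwh) - all invented
    ("oven",            18.0, 45, 1.20),
    ("hob",             18.2, 20, 0.40),
    ("washing_machine",  9.0, 90, 0.80),
    ("oven",            12.5, 30, 0.75),
]

ontology = {  # appliance -> activity it indicates
    "oven": "cooking",
    "hob": "cooking",
    "washing_machine": "laundering",
}

totals, counts = {}, {}
for appliance, _, _, kwh in events:
    activity = ontology[appliance]
    totals[activity] = totals.get(activity, 0.0) + kwh
    counts[activity] = counts.get(activity, 0) + 1

intensity = {a: totals[a] / counts[a] for a in totals}
print(intensity)
```

A real ontology would also encode time-of-day and co-occurrence constraints (e.g. oven plus hob in the same window strengthens a "cooking" inference); this sketch keeps only the appliance-to-activity mapping.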
High-accuracy real-time microseismic analysis platform: case study based on the Super-Sauze mud-based landslide
Understanding the evolution of landslides and other subsurface processes via microseismic monitoring and analysis is of paramount importance in predicting or even avoiding an imminent slope failure (via an early warning system). Microseismic monitoring recordings are often continuous, noisy and consist of signals emitted by various sources. Automated analysis of landslide processes comprises detection, localization and classification of microseismic events (with magnitude < 2 on the Richter scale). Previous research has mainly focused on manually tuning signal processing methods for detecting and classifying microseismic signals based on the signal waveform and its spectrum, which is time-consuming, especially for long-term monitoring and big datasets. This paper proposes an automatic analysis platform that performs event detection and classification, after suitable feature selection, in near real-time. The platform is evaluated using seismology data from the Super-Sauze mud-based landslide, which is located in the southwestern French Alps, and features earthquake, slidequake and tremor-type events.
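A common first stage for detecting events in continuous, noisy microseismic recordings is an STA/LTA trigger; the sketch below (our own toy on synthetic data, not the platform's actual pipeline) flags windows where the short-term average of signal energy rises well above the long-term average.

```python
import numpy as np

# Toy sketch: STA/LTA event detection on a synthetic noisy trace with
# one injected "event" burst.
rng = np.random.default_rng(1)
fs = 100                                    # samples per second
signal = 0.1 * rng.standard_normal(30 * fs) # 30 s of background noise
signal[1500:1600] += np.sin(np.linspace(0, 40 * np.pi, 100))  # 1 s event

def sta_lta(x, n_sta=50, n_lta=500):
    """Ratio of short-term to long-term moving average of energy."""
    e = x ** 2
    sta = np.convolve(e, np.ones(n_sta) / n_sta, mode="same")
    lta = np.convolve(e, np.ones(n_lta) / n_lta, mode="same")
    return sta / (lta + 1e-12)

ratio = sta_lta(signal)
trigger = ratio > 3.0                       # threshold picks the event
print(trigger[1500:1600].any(), trigger[:1000].any())
```

In a full platform this detector would feed feature extraction and a classifier to separate earthquake, slidequake and tremor-type events; the threshold and window lengths here are illustrative choices.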